    On the support of measures in multiplicative free convolution semigroups

    In this paper, we study the supports of measures in multiplicative free convolution semigroups on the positive real line and on the unit circle. We provide formulas for the density of the absolutely continuous parts of measures in these semigroups. The descriptions rely on characterizations of the images of the upper half-plane and the unit disc under certain subordination functions. These subordination functions are η-transforms of measures that are infinitely divisible with respect to multiplicative free convolution. The characterizations also help us study the regularity properties of these measures. One of the main results is that the number of components in the support of measures in the semigroups is a decreasing function of the semigroup parameter.
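    For context, the η-transform mentioned above is a standard tool in free probability; the abstract does not spell out the definition, so the following is the usual convention for a probability measure μ on the positive half-line, not a formula taken from the paper:

    ```latex
    % Standard η-transform of a probability measure \mu on [0, \infty)
    % (usual free-probability convention; assumed, not quoted from the paper)
    \psi_\mu(z) = \int_0^\infty \frac{tz}{1 - tz}\, d\mu(t), \qquad
    \eta_\mu(z) = \frac{\psi_\mu(z)}{1 + \psi_\mu(z)}.
    ```

    Under this convention, the subordination functions referred to in the abstract are η-transforms of measures that are infinitely divisible with respect to multiplicative free convolution.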

    Large Margin Neural Language Model

    We propose a large margin criterion for training neural language models. Conventionally, neural language models are trained by minimizing perplexity (PPL) on grammatical sentences. However, we demonstrate that PPL may not be the best metric to optimize in some tasks, and further propose a large margin formulation. The proposed method aims to enlarge the margin between the "good" and "bad" sentences in a task-specific sense. It is trained end-to-end and can be widely applied to tasks that involve re-scoring of generated text. Compared with minimum-PPL training, our method gains up to 1.1 WER reduction for speech recognition and 1.0 BLEU increase for machine translation.
    Comment: 9 pages. Accepted as a long paper in EMNLP201
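    The abstract does not give the paper's exact loss, but the idea of "enlarging the margin between good and bad sentences" is commonly realized as a pairwise hinge loss over model scores. A minimal sketch under that assumption (function name, score inputs, and margin value are all hypothetical):

    ```python
    def margin_loss(good_scores, bad_scores, margin=1.0):
        """Pairwise hinge loss over sentence scores (e.g. log-probabilities).

        Penalizes every (good, bad) pair whose score gap g - b falls
        short of `margin`; pairs already separated by >= margin cost 0.
        This is a generic large-margin sketch, not the paper's formulation.
        """
        total = 0.0
        for g in good_scores:
            for b in bad_scores:
                # hinge: zero once the good sentence outscores the bad by >= margin
                total += max(0.0, margin - (g - b))
        return total / (len(good_scores) * len(bad_scores))
    ```

    In a re-scoring setup, the "good" score would come from a reference or task-preferred hypothesis and the "bad" scores from competing hypotheses; minimizing this loss pushes the model to rank them apart by at least the margin.
    
    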